Emotion AI


Reinforcing Stereotypes of Anger: Emotion AI on African American Vernacular English

Dorn, Rebecca, Chance, Christina, Rusti, Casandra, Bickham, Charles Jr., Chang, Kai-Wei, Morstatter, Fred, Lerman, Kristina

arXiv.org Artificial Intelligence

Automated emotion detection is widely used in applications ranging from well-being monitoring to high-stakes domains like mental health and hiring. However, models often rely on annotations that reflect dominant cultural norms, limiting their ability to recognize emotional expression in dialects often excluded from training data distributions, such as African American Vernacular English (AAVE). This study examines emotion recognition model performance on AAVE compared to General American English (GAE). We analyze 2.7 million tweets geo-tagged within Los Angeles. Texts are scored for strength of AAVE using computational approximations of dialect features. Annotations of emotion presence and intensity are collected on a dataset of 875 tweets with both high and low AAVE densities. To assess model accuracy on a task as subjective as emotion perception, we calculate community-informed "silver" labels, where AAVE-dense tweets are labeled by African American, AAVE-fluent (ingroup) annotators. On our labeled sample, GPT and BERT-based models exhibit false positive rates of anger prediction on AAVE more than double those on GAE. SpanEmo, a popular text-based emotion model, increases its false positive rate of anger from 25 percent on GAE to 60 percent on AAVE. Additionally, a series of linear regressions reveals that model predictions and non-ingroup annotations are significantly more correlated with profanity-based AAVE features than ingroup annotations are. Linking Census tract demographics, we observe that neighborhoods with higher proportions of African American residents are associated with higher predictions of anger (Pearson's correlation r = 0.27) and lower joy (r = -0.10). These results reveal an emergent safety issue: emotion AI can reinforce racial stereotypes through biased emotion classification. We emphasize the need for culturally and dialect-informed affective computing systems.
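The abstract compares false positive rates of an anger label across dialect groups and reports Pearson correlations with neighborhood demographics. Below is a minimal sketch of those two metrics; the helper functions and all data are illustrative toy examples, not the paper's actual pipeline or results:

```python
# Hypothetical sketch of the evaluation described in the abstract:
# (1) per-dialect-group false positive rate of an "anger" label,
# (2) Pearson correlation between a neighborhood demographic share
#     and the model's anger-prediction rate.
# All numbers below are made up for illustration.

def false_positive_rate(y_true, y_pred):
    """FPR = FP / (FP + TN): fraction of not-angry texts labeled angry."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fp / (fp + tn) if (fp + tn) else 0.0

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Toy "silver" labels (1 = angry) vs. model predictions, per group.
gae_true,  gae_pred  = [0, 0, 0, 1], [0, 1, 0, 1]  # 1 false positive of 3
aave_true, aave_pred = [0, 0, 0, 1], [1, 1, 0, 1]  # 2 false positives of 3

print(false_positive_rate(gae_true, gae_pred))    # ≈ 0.33
print(false_positive_rate(aave_true, aave_pred))  # ≈ 0.67

# Toy tract-level data: demographic share vs. anger-prediction rate.
share = [0.10, 0.30, 0.50, 0.70]
anger = [0.12, 0.18, 0.22, 0.31]
print(pearson_r(share, anger))
```

A disparity like the doubled FPR above is the kind of gap the paper reports between GAE and AAVE; in practice one would compute it over the full labeled sample with the community-informed silver labels.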


what-is-influence-engineering-how-it-relates-to-emotion-ai

#artificialintelligence

The availability of vast data sources and advanced machine learning technologies has given rise to a new system of influence known as influence engineering. It can guide user behavior and lead to new customer acquisition. Using computer vision and pattern analysis techniques, companies can now recognize user emotions (a capability generally called emotion AI) and use them to direct their decision-making processes. Also, advancements in emotion detection and natural language processing present a significant opportunity to automate influential aspects of consumer communication and digital marketing. In fact, in 2021, Gartner named influence engineering as one of the six emerging technologies expected to drive growth for digital marketing.


AI in Healthcare: Upcoming trends in 2023

#artificialintelligence

Advancements in today's world are the result of interdisciplinary research across various fields of science and technology, and artificial intelligence (AI) is playing a significant role in shaping that research. The power of AI in healthcare lies in its ability to process and make sense of an enormous amount of data in a considerably short amount of time. Its applications range from understanding human biology at a genetic level, to managing patient data to improve a hospital's efficiency, to understanding a person's emotional state. AI is reforming and shaping modern healthcare through algorithms and devices that can forecast, assimilate and take the required action, and it is actively used across areas such as medical imaging, wearable sensors and guiding surgical robots.


Council Post: Emotion AI: Why It's The Future Of Digital Health

#artificialintelligence

Have you ever heard of emotion artificial intelligence (AI)? Emotion AI, or affective AI, is a field of computer science that helps machines gain an understanding of human emotions. The MIT Media Lab and Dr. Rosalind Picard are the premier innovators in this space. Through their work, they sparked the idea to help machines develop empathy. Empathy is a complex concept with a lot of strings attached to it, but on a basic level, it means having an understanding of another person's emotional states.


Emotion AI: Can artificial intelligence really read humans? - disruptor.news

#artificialintelligence

The human questioner may not be sure "I'm good" is a fact. But artificial intelligence (AI) and machine learning (ML) engineers claim that new technologies known as "emotion AI" can observe people and accurately assess how they're feeling. AI is all around us, whether we know it or not. It enables mainstream social media platforms to pitch smart personalization; virtual healthcare assistants to help nurses with burnout prevention; integrated smart assistants in electronic devices to perform various tasks; and much more. Artificial emotional intelligence systems go further.


Gartner research: 2 types of emerging AI near hype cycle peak

#artificialintelligence

According to new Gartner research, two types of emerging artificial intelligence (AI) -- emotion and generative AI -- are both reaching the peak of the digital advertising hype cycle. This is thanks to AI's expansion into targeting, measurement, identity resolution and even generating creative content. "I think one of the key pieces is that the options for marketers have been accelerating," Mike Froggatt, senior director analyst in the Gartner marketing practice, told VentureBeat.


The Promise and Potential Impact of Emotion AI on the Future of Work

#artificialintelligence

Emotion AI has been described by some as the future of artificial intelligence. Also known as affective computing or artificial emotional intelligence, the field uses AI to study the non-verbal cues of humans, like body language, facial expressions, gestures and tone of voice, to detect their emotional state. The somewhat controversial field has experienced an explosion of development over the past year. According to some players in the space, it has the potential to impact the future of work and influence a number of industries as development continues during the next few years. "Within the next five years, you'll see some really amazing experiences come out," Rana Gujral, CEO of Behavioral Signals, told Built In.


Emotion AI analyzes facial expressions to guess future attitudes - Dataconomy

#artificialintelligence

More and more businesses are moving into an era wherein artificial intelligence (AI) is a component of every new initiative. One such tool, emotion AI, analyzes facial expressions based on a person's faceprint to infer their goals, attitudes, and interior emotions. This application, also known as affective computing, is founded on the "basic emotions" theory, which asserts that people all over the world express the same six basic internal emotional states (happiness, surprise, fear, disgust, anger, and sadness) through the same facial expressions, shaped by our biological and evolutionary origins. This idea sounds logical on the surface because nonverbal communication heavily relies on facial expressions. Emotion AI is an emerging technology that "allows a computer and systems to identify, process, and simulate human feelings and emotions," according to a recent report by tech industry research firm AIMultiple.


Emotion AI: A possible path to thought policing

#artificialintelligence

A recent VentureBeat article referenced Gartner analyst Whit Andrews saying that more and more companies are entering an era where artificial intelligence (AI) is an aspect of every new project. One such AI application uses facial recognition to analyze expressions based on a person's faceprint to detect their internal emotions or feelings, motivations and attitudes. Known as emotion AI or affective computing, this application is based on the theory of "basic emotions," which states that people everywhere communicate six basic internal emotional states -- happiness, surprise, fear, disgust, anger and sadness -- using the same facial movements, based on our biological and evolutionary origins. On the surface, this assumption seems reasonable, as facial expressions are an essential aspect of nonverbal communication.